Scalable conditional deep inverse Rosenblatt transports using tensor trains and gradient-based dimension reduction
Authors
Abstract
We present a novel offline-online method to mitigate the computational burden of characterizing posterior random variables in statistical learning. In the offline phase, the proposed method learns the joint law of the parameter random variables and the observable random variables in the tensor-train (TT) format. In the online phase, the resulting order-preserving conditional transport map can characterize the posterior random variables given newly observed data in real time. Compared with state-of-the-art normalizing flow techniques, the proposed method relies on function approximation and is equipped with a thorough performance analysis. The function-approximation perspective also allows us to further extend the capability of transport maps to challenging problems with high-dimensional observations and parameters. On the one hand, we present heuristics to reorder and/or reparametrize the variables to enhance the approximation power of TT. On the other hand, we integrate the TT-based transport maps and the parameter reordering/reparametrization into layered compositions to further improve the performance of the resulting transport maps. We demonstrate the efficiency of the proposed method on various statistical learning tasks in ordinary differential equations (ODEs) and partial differential equations (PDEs).
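The online step described above amounts to pushing uniform samples through the inverse of a conditional (Rosenblatt-type) CDF, which is order-preserving by construction. A minimal one-dimensional sketch of that idea, using a toy Gaussian model tabulated on a grid rather than the paper's TT construction (all names and densities here are illustrative):

```python
import numpy as np

# Hypothetical toy model (not from the paper): y ~ N(0, 1) and
# x | y ~ N(y, 0.5^2), so the joint density factorizes explicitly.
grid = np.linspace(-4.0, 4.0, 2001)

def joint_density(y, x):
    return np.exp(-0.5 * y**2) * np.exp(-0.5 * ((x - y) / 0.5) ** 2)

def conditional_inverse_rosenblatt(y_obs, u):
    """Map uniforms u in (0, 1) to posterior samples of x | y = y_obs
    by inverting the conditional CDF (a monotone, order-preserving map)."""
    pdf = joint_density(y_obs, grid)        # unnormalized conditional density
    cdf = np.cumsum(pdf)
    cdf /= cdf[-1]                          # normalize to a proper CDF
    return np.interp(u, cdf, grid)          # invert the CDF on the grid

rng = np.random.default_rng(0)
u = rng.uniform(size=10_000)
samples = conditional_inverse_rosenblatt(1.0, u)
# Here the exact posterior x | y = 1 is N(1, 0.5^2), so the sample
# mean is near 1 and the sample standard deviation near 0.5.
print(samples.mean(), samples.std())
```

The paper's contribution is to make this construction scalable: the joint density is approximated in TT format, so the conditional CDFs needed for the inversion are available in closed form in high dimensions.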
Similar resources
Approximating Conditional Distribution Functions Using Dimension Reduction
Motivated by applications to prediction and forecasting, we suggest methods for approximating the conditional distribution function of a random variable Y given a dependent random d-vector X. The idea is to estimate not the distribution of Y | X, but that of Y | θᵀX, where the unit vector θ is selected so that the approximation is optimal under a least-squares criterion. We show that θ may be esti...
Tensor sufficient dimension reduction.
Tensor is a multiway array. With the rapid development of science and technology in the past decades, large amount of tensor observations are routinely collected, processed, and stored in many scientific researches and commercial activities nowadays. The colorimetric sensor array (CSA) data is such an example. Driven by the need to address data analysis challenges that arise in CSA data, we pro...
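A multiway array such as the one described above can be compressed into the tensor-train format used in the main paper by sequential truncated SVDs (the standard TT-SVD idea). A minimal NumPy sketch, with an illustrative rank-1 test tensor:

```python
import numpy as np

def tt_svd(tensor, tol=1e-10):
    """Decompose a d-way array into TT cores of shape (r_{k-1}, n_k, r_k)
    via sequential truncated SVDs (a textbook TT-SVD sketch)."""
    shape = tensor.shape
    cores = []
    rank = 1
    mat = np.asarray(tensor)
    for n in shape[:-1]:
        mat = mat.reshape(rank * n, -1)
        u, s, vt = np.linalg.svd(mat, full_matrices=False)
        keep = max(1, int(np.sum(s > tol * s[0])))   # truncate small singular values
        cores.append(u[:, :keep].reshape(rank, n, keep))
        mat = s[:keep, None] * vt[:keep]             # carry the remainder forward
        rank = keep
    cores.append(mat.reshape(rank, shape[-1], 1))
    return cores

def tt_reconstruct(cores):
    """Contract the TT cores back into the full array."""
    out = cores[0]
    for core in cores[1:]:
        out = np.tensordot(out, core, axes=([-1], [0]))
    return out.reshape([c.shape[1] for c in cores])

# Rank-1 test tensor: outer product of three vectors.
a, b, c = np.arange(1, 4.0), np.arange(1, 5.0), np.arange(1, 6.0)
t = np.einsum('i,j,k->ijk', a, b, c)
cores = tt_svd(t)
print([core.shape for core in cores])
print(np.allclose(tt_reconstruct(cores), t))
```

Storage scales linearly in the number of dimensions when the TT ranks stay small, which is what makes density approximation in this format tractable for the high-dimensional problems targeted by the main paper.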
Gradient-based kernel dimension reduction for regression
This paper proposes a novel approach to linear dimension reduction for regression using nonparametric estimation with positive definite kernels or reproducing kernel Hilbert spaces. The purpose of the dimension reduction is to find such directions in the explanatory variables that explain the response sufficiently: this is called sufficient dimension reduction. The proposed method is based on a...
Scalable High Performance Dimension Reduction
Dimension reduction is a useful tool for visualization of such high-dimensional data to make data analysis feasible for such vast volume and high-dimensional scientific data. Among the known dimension reduction algorithms, multidimensional scaling algorithm is investigated in this proposal due to its theoretical robustness and high applicability. Multidimensional scaling is known as a non-linea...
Approximating Conditional Distribution Functions Using Dimension Reduction by Peter Hall
Motivated by applications to prediction and forecasting, we suggest methods for approximating the conditional distribution function of a random variable Y given a dependent random d-vector X. The idea is to estimate not the distribution of Y | X, but that of Y | θᵀX, where the unit vector θ is selected so that the approximation is optimal under a least-squares criterion. We show that θ may be est...
Journal
Journal: Journal of Computational Physics
Year: 2023
ISSN: 1090-2716, 0021-9991
DOI: https://doi.org/10.1016/j.jcp.2023.112103